METHOD AND SYSTEM FOR LOCALIZATION AND MAPPING
Abstract:
The invention relates to a localization and mapping method used by a mobile machine in an environment, comprising the following steps:
- determination (E34), on the basis of data received from a sensor on board the mobile machine, of the type of an object located in an area of the environment;
- implementation (E40) of a localization algorithm using detection data, without taking into account the detection data relating to said area or said object when the determined type is a mobile object type.
Localization and mapping systems are also described.

Publication number: FR3025898A1
Application number: FR1402084
Filing date: 2014-09-17
Publication date: 2016-03-18
Inventor: Paulo Resende
Applicant: Valeo Schalter und Sensoren GmbH
Description:
[0001] FIELD OF THE INVENTION

The present invention relates to localization and mapping techniques that can be used by a mobile machine in an environment containing moving objects. It relates more particularly to a localization and mapping method and to a localization and mapping system. The invention is particularly advantageous in the case where certain objects present in the environment are stationary during the passage of the mobile machine equipped with the localization and mapping system, but may be moved later.

BACKGROUND ART

Localization and mapping methods are known that are used by a machine (for example a robot or a motor vehicle) moving in an environment in order to build, on the basis of information provided by one or more sensors carried on board the mobile machine, a map of that environment. Such methods are generally referred to as SLAM (Simultaneous Localization and Mapping). An example of such a method using a visual sensor (for example a video camera) is described in patent application WO 2004/059900. Localization and mapping algorithms are generally designed to map only the fixed parts of the environment and therefore do not memorize the position of objects that move during the execution of the algorithm, that is to say during the passage of the mobile machine near these objects. On the other hand, a problem arises for objects that are stationary during the passage of the mobile machine but may be moved at a later time, and therefore do not really form part of the fixed environment that is to be mapped. In particular, during a subsequent pass through the region where the displaced object was located, the localization and mapping algorithm will not recognize the previously mapped environment and will restart the map construction process, which is of course not efficient.

OBJECT OF THE INVENTION

In this context, the present invention proposes a localization and mapping method used by a mobile machine in an environment, comprising the following steps:
- determination, on the basis of data received from a sensor on board the mobile machine, of the type of an object located in an area of the environment;
- implementation of a localization algorithm using detection data, without taking into account the detection data relating to said area or said object when the determined type is a mobile object type.

Thus, the localization and mapping algorithm is implemented on the basis of environment components that are fixed in the long term. The map constructed by such a method is therefore more robust and can easily be re-used, since the elements that compose it will all still be present during a subsequent passage of the mobile machine through the same environment.
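By way of illustration only, the claimed method can be summarized in code form. The following minimal Python sketch uses invented names (DetectedObject, MOBILE_TYPES, slam_iteration are not taken from the patent): mobile-type detections are simply excluded before the localization and mapping update.

```python
from dataclasses import dataclass, field

# Assumed type taxonomy: the method only requires distinguishing
# mobile object types from fixed ones (these names are illustrative).
MOBILE_TYPES = {"vehicle", "pedestrian", "cyclist"}

@dataclass
class DetectedObject:
    obj_id: int
    obj_type: str    # type Ti determined in step E34
    location: tuple  # location Li relative to the mobile machine

@dataclass
class EnvironmentMap:
    landmarks: list = field(default_factory=list)

def slam_iteration(detected_objects, env_map):
    """One iteration of the described method: only fixed-type objects
    contribute to localization and map building (steps E34 and E40)."""
    fixed = [o for o in detected_objects if o.obj_type not in MOBILE_TYPES]
    known_ids = {lm.obj_id for lm in env_map.landmarks}
    for obj in fixed:
        # Localization against existing landmarks would happen here;
        # this sketch shows only the map-enrichment side.
        if obj.obj_id not in known_ids:
            env_map.landmarks.append(obj)  # new long-term reference point
    return env_map
```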
According to other optional, and therefore non-limiting, features:
- the sensor is a lidar;
- said determination is carried out by pattern recognition, or by signature recognition, in the received data;
- the sensor is an image sensor;
- said determination is carried out by pattern recognition in at least one image represented by the received data;
- the detection data come from the on-board sensor;
- the detection data come from another sensor distinct from said on-board sensor;
- the localization algorithm uses said object as a reference point if the determined type is a fixed object type;
- the localization algorithm uses the detection data relating to a given area if no object located in said given area is detected with a type corresponding to a mobile object type;
- the localization algorithm carries out the construction of a map of the environment, for example by searching for correspondences between a version of the map being constructed and scan data provided by an on-board sensor and/or points of interest detected in an image provided by an on-board sensor, which also allows the mobile machine to be located in said map.

The localization and mapping method may further comprise the following steps:
- saving the constructed map;
- at a later time (for example upon detecting a surrounding environment similar to the one represented in the constructed map), loading and re-using the constructed map in the localization algorithm.

The invention also proposes a localization and mapping system for equipping a machine that is mobile in an environment, comprising: a module for determining, on the basis of data received from a sensor on board the mobile machine, the type of an object located in an area of the environment; and a localization module designed to locate the mobile machine as a function of detection data, without taking into account the detection data relating to said area or said object when the determined type is a mobile object type. The optional features presented above for the method can also be applied to such a system.

DETAILED DESCRIPTION OF AN EXEMPLARY EMBODIMENT

The following description, given with reference to the accompanying drawings by way of non-limiting example, will make it clear what the invention consists of and how it can be carried out. In the accompanying drawings:
- FIG. 1 represents a motor vehicle equipped with a localization and mapping system according to the invention;
- FIG. 2 represents an example of a particular context that the vehicle of FIG. 1 may encounter;
- FIG. 3 schematically represents a first example of a localization and mapping system according to the invention;
- FIG. 4 represents a data table used in the system of FIG. 3;
- FIG. 5 schematically represents a second example of a localization and mapping system according to the invention; and
- FIG. 6 illustrates the main steps of a localization and mapping method according to the invention.

FIG. 1 represents a motor vehicle V equipped with a localization and mapping system S. The localization and mapping system S is here implemented in the form of a microprocessor-based processing device. Such a processing device comprises at least one memory (for example a read-only memory or a rewritable non-volatile memory, and generally also a random access memory) adapted to store computer program instructions whose execution by the microprocessor of the processing device causes the processing device to implement the methods and processing operations described below.
The motor vehicle V comprises one or more on-board sensors, for example a visual sensor such as a video camera CAM and/or a distance sensor such as a laser rangefinder, or lidar ("light detection and ranging"), LID. The localization and mapping system S receives the data INFOCAM, INFOLID generated by the on-board sensor(s) and processes them both to construct a map C of the environment in which the vehicle V moves and to obtain the location of the vehicle V within the constructed map C.

FIG. 2 represents an example of the context that the vehicle V may encounter. In this example, the vehicle V is moving along a two-way street R, lined on either side of the roadway by a sidewalk TR and, beyond the sidewalk TR, by dwellings H. A third-party vehicle V' is parked in the part of the street R situated ahead of the vehicle V, straddling the roadway of street R and the sidewalk TR.

FIG. 3 schematically represents a first example of a localization and mapping system according to the invention. In this example, the localization and mapping system S uses the data INFOCAM, INFOLID delivered by two sensors (here the video camera CAM and the lidar LID).

[0002] FIG. 3 shows functional modules which each correspond to a particular processing operation carried out by the localization and mapping system S. In the example described here, the processing operations are carried out, as already indicated, through the execution, by the microprocessor of the system S, of computer program instructions stored in a memory of the system S. Alternatively, the processing performed by one or more functional modules could be implemented by a dedicated integrated circuit, for example an application-specific integrated circuit (or ASIC, for "Application Specific Integrated Circuit").

The system of FIG. 3 comprises a detection module 10 which receives the INFOCAM data generated by a first sensor, in this case the video camera CAM, and generates, for each detected object OBJi, location information Li for the object concerned. The location information Li is, for example, stored in a table TAB held in the memory of the system S, as schematically represented in FIG. 4. In the example described here, the INFOCAM data represent images taken successively by the video camera CAM; the objects OBJi are detected and located relative to the motor vehicle V by analysis of these images, as described for example in patent application WO 2004/059900 already mentioned in the introduction. In the context of FIG. 2, the detection module 10 detects, for example, the third-party vehicle V' as object OBJ1 and determines its location (defined by the location information L1) with respect to the vehicle V by analyzing the images provided by the video camera CAM.

The system of FIG. 3 also comprises a classification module 12 which receives as input the INFOCAM data generated by the first sensor (in this case the video camera CAM) and a designation of the detected objects OBJi (including, for example, their position in the image received from the video camera CAM). The classification module 12 is designed to identify the type Ti of each object OBJi on the basis of the INFOCAM data received from the first sensor, for example, in the case described here where the INFOCAM data represent an image, by means of a shape recognition algorithm. As a variant, the first sensor could be the lidar LID, in which case the type of an object OBJi could be identified, for example, on the basis of the signature of the signal received by the lidar LID after reflection on the object OBJi.
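The content of the table TAB can be pictured as a small keyed store. The sketch below is an assumption about its shape (field names, coordinates and the type list are invented for illustration); it captures the key point, developed next, that classification is by type, independently of whether the object is currently moving.

```python
# Hypothetical rendering of the table TAB of FIG. 4: one entry per
# detected object OBJi, holding its location Li (from detection
# module 10) and its type Ti (from classification module 12).
MOBILE_TYPES = {"vehicle", "pedestrian", "cyclist"}  # assumed taxonomy

TAB = {}  # obj_id -> {"location": Li, "type": Ti, "mobile": bool}

def record_object(obj_id, location, obj_type):
    """Store detection and classification results for the filtering step."""
    TAB[obj_id] = {
        "location": location,              # Li, relative to vehicle V
        "type": obj_type,                  # Ti, e.g. from shape recognition
        "mobile": obj_type in MOBILE_TYPES,
    }

# In the FIG. 2 scene, the parked third-party vehicle V' becomes OBJ1;
# it is classified by its type, not by whether it is currently moving.
record_object(1, location=(12.0, 3.5), obj_type="vehicle")
assert TAB[1]["mobile"]  # stationary now, but of a mobile type
```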
The identification of the type Ti of the object OBJi makes it possible to classify it among several object types (for example vehicle, pedestrian, cyclist, dwelling, lighting element, road signage element, etc.) and thus to determine whether this object OBJi has a mobile type or a fixed type. Note that this classification by object type is performed regardless of whether the object concerned is actually stationary or moving during the passage of the vehicle V. For example, in the context of Figure 2, the classification module 12 determines, by shape recognition, that the object OBJ1 (that is to say the third-party vehicle V', as explained above) is of the vehicle type. The object type Ti is stored, in association with the object concerned OBJi, in the aforementioned table TAB, as represented in FIG. 4. As a variant, the stored information could be limited to an indication of the mobility or fixity of the object concerned OBJi, this indication being determined on the basis of the type Ti identified as indicated above. For the sake of clarity, the detection module 10 and the classification module 12 have been described as two separate modules. It is conceivable, however, that the detection of an object OBJi and the identification of its type Ti (which allows its classification as a mobile object or as a fixed object) are carried out in a single step, for example by means of a shape recognition algorithm applied to the images delivered by the video camera CAM.

The system S comprises a filtering module 14 which receives the INFOLID data from the second sensor, here the lidar LID. The filtering module 14 also uses the location Li of each object OBJi detected by the detection module 10 and the type Ti of each object determined by the classification module 12 (this information can be received from the module concerned or read from the stored table TAB).

[0003] In the example described here, the INFOLID data delivered by the lidar represent, for example, a set of detection distance values d(α) associated respectively with angles α over the entire angular range 0° to 360°. The filtering module 14 transmits, among the INFOLID data, only the INFOFIX data corresponding to zones for which no object has been detected, or for which an object OBJi has been detected with a fixed object type Ti, according to the information generated by the detection and classification modules 10, 12 as described above. In other words, the filtering module 14 does not transmit the INFOLID data relating to zones for which an object OBJi has been detected with a mobile object type Ti. In the context of FIG. 2, the object OBJ1 (the third-party vehicle V') detected with a mobile object type T1 (vehicle) covers, according to the location information L1, the angular range [α1, α2], so that, in the absence of any other object identified with a mobile object type, the filtering module 14 transmits only the INFOFIX data associated with the angular ranges [0°, α1[ and ]α2, 360°[ (that is to say, data representative of the values d(α) only for 0° ≤ α < α1 and α2 < α < 360°).
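Since the INFOLID data are a set of distances d(α) indexed by angle, the filtering reduces to masking the angular ranges occupied by mobile-type objects. A minimal sketch, with invented function names and numeric bounds:

```python
def filter_lidar_scan(scan, mobile_zones):
    """Sketch of filtering module 14 (assumed interface): 'scan' maps a
    bearing alpha in degrees to a measured distance d(alpha);
    'mobile_zones' lists the angular ranges (a1, a2) covered by
    mobile-type objects. Only samples outside every such range are
    forwarded as INFOFIX."""
    def masked(alpha):
        return any(a1 <= alpha <= a2 for a1, a2 in mobile_zones)
    return {alpha: d for alpha, d in scan.items() if not masked(alpha)}

# FIG. 2 example: the third-party vehicle V' covers [a1, a2], so only
# d(alpha) for 0 <= alpha < a1 and a2 < alpha < 360 survives. The bounds
# below are made up for the demonstration.
scan = {a: 10.0 for a in range(360)}        # dummy 1-degree-resolution scan
kept = filter_lidar_scan(scan, [(40, 75)])  # hypothetical a1=40, a2=75
assert all(not (40 <= a <= 75) for a in kept)
```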
The INFOFIX data transmitted, after filtering, by the filtering module 14 are received by a localization module 16, which uses these INFOFIX data to implement a simultaneous localization and mapping algorithm, for example as described in the article "A real-time robust SLAM for large-scale outdoor environments" by J. Xie, F. Nashashibi, M. N. Parent and O. Garcia-Favrot, in ITS World Congress, 2010. The localization module 16 makes it possible, as a function of the INFOFIX data from the second sensor (in the example described, the lidar LID), here after filtering, and using a map C constructed by the localization module 16 during previous iterations, firstly to determine the current position (or location) LOC of the vehicle V in the map C, and secondly to enrich the map C, in particular thanks to the presence, among the INFOFIX data, of data relating to areas not reached by the lidar LID during previous iterations. Note, however, that thanks to the rejection (by the filtering module 14) of the INFOLID data relating to areas where an object OBJi has been detected with a mobile object type Ti, only data relating to objects that remain present are processed by the localization module 16, which avoids the processing of data that are in fact useless (hence faster processing) and also allows the construction of a map containing no object liable to be moved later: such a map is more robust and easily reusable.

Figure 5 schematically shows a second example of a localization and mapping system according to the invention. In this example, the localization and mapping system S uses the DAT data delivered by a single sensor, here the video camera CAM. As with FIG. 3, FIG. 5 shows functional modules which each correspond to a particular processing operation carried out by the localization and mapping system S, here through the execution, by the microprocessor of the system S, of computer program instructions. In a variant, the processing performed by one or more functional modules could be implemented by a dedicated integrated circuit, for example an application-specific integrated circuit (or ASIC, for "Application Specific Integrated Circuit"). The system of FIG. 5 comprises a detection module 20 which receives the DAT data generated by the sensor, here data representative of images taken by the video camera CAM, and generates, for each detected object OBJi (here by analysis of these images), location information Li for the object concerned. The location information Li is, for example, stored in a table TAB held in the memory of the system S, as schematically represented in FIG. 4. The system S of FIG. 5 comprises a classification module 22 which receives as input the DAT data generated by the sensor (here the video camera CAM) and a designation of the detected objects OBJi (including, for example, their position in the image received from the video camera CAM). The classification module 22 is designed to identify the type Ti of each object OBJi on the basis of the DAT data received from the sensor, for example, in the case described here where the DAT data represent an image, by means of a shape recognition algorithm.
[0004] As already indicated with reference to FIG. 3, the identification of the type Ti of the object OBJi makes it possible to classify it among several object types and thus to determine whether this object OBJi has a mobile type or a fixed type, regardless of whether the object concerned is actually stationary or moving when the vehicle V passes. The object type Ti is stored, in association with the object concerned OBJi, in the aforementioned table TAB, as represented in FIG. 4. As a variant, the stored information could be limited to an indication of the mobility or fixity of the object type Ti recognized for the object concerned OBJi.

[0005] As already indicated with reference to FIG. 3, it is conceivable that the detection of an object OBJi and the identification of its type Ti (which allows its classification as a mobile object or as a fixed object) are performed in a single processing step (that is to say, by a single functional module).

[0006] A localization module 26 receives, for each object detected by the detection module 20, the designation of this object OBJi, its location Li and its identified type Ti, and implements, on the basis of this information, a simultaneous localization and mapping algorithm, also using a map C constructed during previous iterations of the algorithm. The map C comprises, for example, a set of landmarks (or reference points), each corresponding to an object detected during a previous iteration. The localization module 26 is designed so that, in the processing it performs, only those objects OBJi are considered whose associated type Ti does not correspond to a mobile object type. For example, before taking the location information Li of an object OBJi into account in the simultaneous localization and mapping algorithm, the localization module 26 checks the type Ti of the object OBJi (here by consulting the table TAB stored in the memory of the system S) and actually uses the location information Li in the algorithm only if the type Ti is a fixed object type, and not a mobile object type. On the basis of the location information Li of the objects whose type Ti corresponds to a fixed object type (but without taking into account the location information of objects whose type Ti corresponds to a mobile object type), the localization module 26 determines the current position (or location) LOC of the vehicle V in the map C (typically by comparing each detected object OBJi with the reference points included in the map C) and enriches the map C (typically by adding to the map C the detected objects OBJi that do not correspond to any reference point, so that they each form a new reference point in the completed map C). As already indicated with respect to the first example of a localization and mapping system, the constructed map C is robust and easily reusable because it is constructed on the basis of objects that are not likely to be moved.
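The type check performed by the localization module 26 can be sketched as follows. The nearest-landmark association is an assumption made to keep the example self-contained; a real SLAM back end would perform probabilistic data association and pose estimation.

```python
def find_reference_point(env_map, location, tol=1.0):
    """Naive nearest-landmark association (purely illustrative)."""
    for lm in env_map["landmarks"]:
        dx = lm["location"][0] - location[0]
        dy = lm["location"][1] - location[1]
        if (dx * dx + dy * dy) ** 0.5 < tol:
            return lm
    return None

def update_with_landmarks(detected, env_map, mobile_types):
    """Sketch of the check by localization module 26: mobile-type objects
    are neither matched against the map's reference points nor added."""
    for obj in detected:
        if obj["type"] in mobile_types:
            continue                          # never mapped, never matched
        match = find_reference_point(env_map, obj["location"])
        if match is None:
            env_map["landmarks"].append(obj)  # becomes a new reference point
        # else: the match would contribute to estimating the pose LOC
    return env_map
```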
FIG. 6 illustrates the main steps of a localization and mapping method according to the invention. This method starts with a step E30 of receiving data generated by a sensor on board the vehicle V, here the video camera CAM or the lidar LID. The method continues with a step E32 of detecting objects present in the environment in which the vehicle V is moving, by analyzing the data received from the on-board sensor in step E30. This step is carried out, in the examples described above, by the detection module 10, 20. The method then comprises a step E34 of determining, for each object detected in step E32 and on the basis of the data received from the on-board sensor in step E30, the type of the object concerned, for example by means of a shape recognition algorithm (when the data received from the on-board sensor represent an image) or by means of a signature recognition algorithm (when the data received from the on-board sensor represent a signal). This step is carried out, in the examples described above, by the classification module 12, 22. It should be noted that data from several sensors could alternatively be used to classify the objects according to their type, possibly after a step of merging the data from the different sensors.

When two sensors are used, respectively for the classification of the objects and for the localization of the mobile machine, as in the first example given above with reference to FIG. 3, the method comprises a step E36 of receiving data generated by the second sensor. The method may then optionally include a step E38 of filtering the data received in step E36 in order to reject the data relating to objects whose type determined in step E34 is a mobile object type, or relating to zones of the environment where an object has been detected with a type (determined in step E34) corresponding to a mobile object type. The filtering module 14 used in the first example described above with reference to FIG. 3 implements such a step. Alternatively, as in the case of the second example described with reference to FIG. 5, the method does not include a specific filtering step; the localization step described below is in that case designed to operate without taking into account data relating to objects whose type determined in step E34 is a mobile object type, or relating to areas of the environment where an object has been detected with a type (determined in step E34) corresponding to a mobile object type.

The method continues with a step E40 of localizing the mobile machine (here the motor vehicle V) on the basis of detection data, which may be the data received in step E30 and/or the data received in step E36 (when such a step is implemented), and on the basis of a map constructed during previous iterations of the method. This step E40 comprises the implementation of a simultaneous localization and mapping algorithm, which allows not only the localization of the machine but also the enrichment of the map. Note that, as indicated above, the data used by the localization and mapping algorithm can come from several on-board sensors, possibly after a step of merging the data from the different sensors. The method then generally loops back to step E30 for the implementation, at a later time, of a new iteration of steps E30 to E40.

[0007] The map constructed during the method just described is permanently saved so that it can be re-used later, for example during a passage of the mobile machine (here the motor vehicle V) through the environment at a later time (for example on a day after the day the map was built).

[0008] For example, the localization and mapping system S integrates a mechanism for comparing the map being constructed with the previously constructed (and stored) maps in order to re-use them. Thus, when the mobile machine travels through the same environment again at said later time, the comparison mechanism makes it possible to recognize the surrounding environment as the one represented in a previously constructed map and to use that map (by loading it into memory and using it in the localization and mapping algorithm). The comparison mechanism works particularly well when the stored map has been constructed by the method described above, because such a map contains only information relating to objects that remain fixed and contains no information relating to objects that will no longer be present during the subsequent passage.
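A minimal sketch of such a save/compare/re-use cycle, assuming a JSON-serializable map and a simple landmark-count similarity test (both are assumptions; the patent does not specify a storage format or comparison metric):

```python
import json

def save_map(env_map, path):
    """Persist the constructed map for later runs (paragraph [0007])."""
    with open(path, "w") as f:
        json.dump(env_map, f)

def try_reuse_map(current_landmarks, stored_paths, min_matches=10):
    """Sketch of the comparison mechanism of paragraph [0008], under the
    assumption that maps are compared by counting matching landmarks."""
    for path in stored_paths:
        with open(path) as f:
            stored = json.load(f)
        matches = sum(1 for lm in stored["landmarks"]
                      if lm in current_landmarks)
        if matches >= min_matches:
            return stored    # recognized environment: re-use this map
    return None              # unknown environment: keep building
```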
The invention therefore provides a map that can be used in the long term for localizing the mobile machine.
Claims

1. Localization and mapping method used by a mobile machine (V) in an environment, comprising the following steps:
- determination (E34), on the basis of data (INFOCAM; DAT) received from a sensor (CAM) on board the mobile machine (V), of the type (Ti) of an object (OBJi) located in an area of the environment;
- implementation (E40) of a localization algorithm using detection data (INFOLID; DAT), without taking into account the detection data (INFOLID; DAT) relating to said area or said object (OBJi) when the determined type (Ti) is a mobile object type.

2. The method of claim 1, wherein the sensor (LID) is a lidar.

3. The method of claim 2, wherein said determination is performed by pattern recognition or signature recognition in the received data.

4. The method of claim 1, wherein the sensor (CAM) is an image sensor.

5. The method of claim 4, wherein said determination is performed by pattern recognition in at least one image represented by the received data (INFOCAM).

6. Method according to one of claims 1 to 5, wherein the detection data (DAT) come from the on-board sensor (CAM).

7. Method according to one of claims 1 to 5, wherein the detection data (INFOLID) come from another sensor (LID) distinct from said on-board sensor (CAM).

8. Method according to one of claims 1 to 7, wherein the localization algorithm uses said object (OBJi) as a reference point if the determined type (Ti) is a fixed object type.

9. Method according to one of claims 1 to 7, wherein the localization algorithm uses the detection data (INFOLID) relating to a given area if no object located in said given area is detected with a type corresponding to a mobile object type.

10. Method according to one of claims 1 to 9, wherein the localization algorithm implemented carries out the construction of a map (C) of the environment.

11. The method of claim 10, comprising the following steps:
- saving the constructed map (C);
- at a later time, loading and re-using the constructed map (C) in the localization algorithm.

12. Localization and mapping system (S) for equipping a mobile machine (V) in an environment, comprising:
- a determination module (12; 22) for determining, on the basis of data (INFOCAM; DAT) received from a sensor (CAM) on board the mobile machine, the type (Ti) of an object (OBJi) located in an area of the environment;
- a localization module (16; 26) adapted to localize the mobile machine (V) as a function of detection data (INFOLID; DAT), without taking into account the detection data relating to said area or said object (OBJi) when the determined type (Ti) is a mobile object type.